> Umm. Comparison and contrast. First you mention the case where you think
> it "will" work, then you mention the one where you're not so certain.
A HUD is no different from a mesh with two triangles. Using the algorithm
Darren and I both explained, it will just work.
> This is a Windows, Mac *and* Linux client. OpenGL may have something
> similar,
If not, I suspect the algorithm for checking ray/triangle intersections is
pretty simple - in fact the source for the DirectX implementation is
available in the SDK. I guess POV has something similar in its source too.
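For what it's worth, the standard Möller-Trumbore test really is only a few
lines. Here's a minimal sketch in C (hypothetical names, not the actual SDK
source) that returns the hit distance plus the barycentric coordinates of the
hit point:

```c
#include <math.h>
#include <stdbool.h>

typedef struct { double x, y, z; } Vec3;

static Vec3 sub(Vec3 a, Vec3 b) { return (Vec3){a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return (Vec3){a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Möller-Trumbore: returns true on a hit and fills in t (distance along
   the ray) and the barycentric coordinates u, v of the hit point. */
bool ray_triangle(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2,
                  double *t, double *u, double *v)
{
    const double EPS = 1e-9;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p = cross(dir, e2);
    double det = dot(e1, p);
    if (fabs(det) < EPS) return false;   /* ray parallel to triangle */
    double inv = 1.0 / det;
    Vec3 s = sub(orig, v0);
    *u = dot(s, p) * inv;
    if (*u < 0.0 || *u > 1.0) return false;
    Vec3 q = cross(s, e1);
    *v = dot(dir, q) * inv;
    if (*v < 0.0 || *u + *v > 1.0) return false;
    *t = dot(e2, q) * inv;
    return *t > EPS;                     /* hit in front of the ray origin */
}
```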
> but knowing what it is doesn't necessarily help if you don't
> know how to get from that to the final result. lol I am sure its simple
> math, but..
It's very simple. If x and y are the barycentric coordinates returned by your
triangle/ray intersection algorithm, the texture coordinates of the hit
point are:
Tu = x * V1.u + y * V2.u + (1-x-y) * V3.u
Tv = x * V1.v + y * V2.v + (1-x-y) * V3.v
Where V1.u is the texture x coordinate of vertex 1, etc.
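As a sketch in C, following the same weighting convention as the formulas
above (x weights V1, y weights V2). One caveat: some intersection routines,
D3DXIntersectTri among them if I remember right, weight V2 and V3 with the
returned u and v instead, so check which convention yours uses. Struct names
here are hypothetical:

```c
/* Texture coordinates of one vertex. */
typedef struct { double u, v; } TexCoord;

/* Interpolate texture coordinates at a hit point given its barycentric
   coordinates x, y: x weights V1, y weights V2, (1-x-y) weights V3. */
TexCoord interpolate_uv(double x, double y,
                        TexCoord V1, TexCoord V2, TexCoord V3)
{
    TexCoord t;
    t.u = x * V1.u + y * V2.u + (1.0 - x - y) * V3.u;
    t.v = x * V1.v + y * V2.v + (1.0 - x - y) * V3.v;
    return t;
}
```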
> Someone else has already suggested adding procedural textures as well,
> as a means to bypass the issue of custom ones, in cases where its just
> damn stupid to use them, like making rocks, or any other surface that
> doesn't "need" hand drawn Photoshop images.
That sounds like a good idea: sending a few hundred bytes of pixel shader
code seems more efficient than sending the textures themselves. The engine on
the client can then render the texture and create the mipmaps locally, once.
Or, if "live" textures are needed, the pixel shader can be used directly in
the game, which would allow animated textures :-)
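The bake-once path is easy to picture. A sketch in C, where a trivial
checkerboard function stands in for whatever pattern those few hundred bytes
of shader code would actually describe:

```c
#include <stdint.h>

/* Sketch: bake a procedural texture into an RGBA pixel buffer once on
   the client, instead of downloading image data. The checkerboard here
   is a stand-in for an arbitrary procedural pattern. */
void bake_checker(uint8_t *rgba, int w, int h, int cell)
{
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            int on = ((x / cell) + (y / cell)) & 1;  /* alternate cells */
            uint8_t c = on ? 255 : 0;
            uint8_t *p = rgba + 4 * (y * w + x);
            p[0] = p[1] = p[2] = c;  /* grey value */
            p[3] = 255;              /* fully opaque */
        }
    }
}
```

The resulting buffer can be uploaded as a texture and mipmapped by the
engine as usual.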